17 authors, including George RR Martin, have filed a lawsuit against OpenAI for allegedly copying their work to train AI models. Here's what we know.

George RR Martin Takes Legal Action Against OpenAI For Utilizing His Writing To Train ChatGPT

George RR Martin, author of the hugely successful A Song of Ice and Fire series (the basis for Game of Thrones), along with other authors, has sued the generative artificial intelligence company OpenAI. They claim that OpenAI used their books to train ChatGPT and other AI models.

As The Verge reports, 17 popular authors, including David Baldacci, Jonathan Franzen, John Grisham, George R.R. Martin, and Jodi Picoult, have filed suit in the Southern District of New York.

In the complaint, the authors accuse OpenAI of “wholesale copying of the authors’ works without permission or consideration.” Once copied, the data was used to train OpenAI’s large language models.

For the uninitiated, large language models, or LLMs, are the brains of chatbots like ChatGPT and Google Bard. For example, the free version of ChatGPT uses GPT-3.5, while Google’s Bard runs on its PaLM 2 model.
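To make the LLM-to-chatbot relationship concrete, here is a minimal sketch of how an application can call a hosted model such as GPT-3.5 through the OpenAI Python SDK. The prompt text and environment setup are illustrative assumptions, not anything taken from the lawsuit or from either company's statements.

```python
# Minimal sketch: a chatbot-style request to a hosted LLM (GPT-3.5)
# using the OpenAI Python SDK (openai>=1.0).
# Assumes the OPENAI_API_KEY environment variable is set; the prompt
# text below is purely illustrative.
from openai import OpenAI

client = OpenAI()  # picks up the API key from OPENAI_API_KEY

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # the model family behind the free ChatGPT tier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "In one sentence, what is a large language model?"},
    ],
)

# The reply is generated from patterns the model learned during training.
print(response.choices[0].message.content)
```

Whatever text the model returns is produced from patterns learned during training, which is exactly why the source of that training data sits at the heart of the authors’ complaint.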

The complaint notes that the “authors’ livelihoods are based on the works they create. But defendants’ LLMs jeopardize the ability of fiction writers to make a living because LLMs allow anyone to create—automatically and for free (or very cheaply)—texts that they would otherwise pay writers to create.” It added: “Furthermore, Defendants’ LLMs may spew out derivative works: material based on Plaintiffs’ works that imitates, summarizes, or compares Plaintiffs’ work and harms their markets.”

In response, OpenAI CEO Sam Altman said that he shares the creators’ concerns and that ensuring “the creator economy continues to be vibrant” is a high priority for OpenAI.

“OpenAI doesn’t want to replace creators,” Altman added.

How this battle between AI and content creators will play out in the coming days, months and years is anyone’s guess. Generative AI seems to be here to stay, and as LLMs improve, they may need more training material to keep up. It will be interesting to see how this unfolds.
